
    Proficiency-aware systems

    In an increasingly digital world, technological developments such as data-driven algorithms and context-aware applications create opportunities for novel human-computer interaction (HCI). We argue that these systems have the latent potential to stimulate users and encourage personal growth. However, users increasingly rely on the intelligence of interactive systems. Thus, it remains a challenge to design for proficiency awareness, essentially demanding increased user attention whilst preserving user engagement. Designing and implementing systems that allow users to become aware of their own proficiency and encourage them to recognize learning benefits is the primary goal of this research. In this thesis, we introduce the concept of proficiency-aware systems as one solution. In our definition, proficiency-aware systems use estimates of the user's proficiency to tailor the interaction in a domain and facilitate a reflective understanding of this proficiency. We envision that proficiency-aware systems leverage collected data for learning benefits. Here, we see self-reflection as key for users to become aware of the effort necessary to advance their proficiency. A central challenge for proficiency-aware systems is that users' self-perception of their proficiency often differs from the system's estimate. The benefits of personal growth and of advancing one's repertoire might not be immediately apparent to users, which may alienate them and lead them to abandon the system. To tackle this challenge, this work does not rely on prescribed learning strategies but rather focuses on the capabilities of interactive systems to provide users with the necessary means to reflect on their proficiency, such as showing calculated text difficulty to a newspaper editor or visualizing muscle activity to a passionate sportsperson. First, we elaborate on how proficiency can be detected and quantified in the context of interactive systems using physiological sensing technologies. Through developing interaction scenarios, we demonstrate the feasibility of gaze- and electromyography-based proficiency-aware systems by utilizing machine learning algorithms that estimate users' proficiency levels for stationary vision-dominant tasks (reading, information intake) and dynamic manual tasks (playing instruments, fitness exercises). Second, we show how to facilitate proficiency awareness for users, including the design challenges of when and how to communicate proficiency. We complement this second part by highlighting the necessity of toolkits for sensing modalities to enable the implementation of proficiency-aware systems for a wide audience. In this thesis, we contribute a definition of proficiency-aware systems, which we illustrate by designing and implementing interactive systems. We derive technical requirements for real-time, objective proficiency assessment and identify design qualities for communicating proficiency through user reflection. We summarize our findings in a set of design and engineering guidelines for proficiency awareness in interactive systems, highlighting that proficiency feedback makes performance interpretable for the user.
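
    A minimal sketch of the kind of pipeline such a system implies: fixation-level gaze features feed an off-the-shelf classifier that outputs a proficiency estimate. The feature set, the synthetic data, and the classifier choice are illustrative assumptions, not the pipeline described in the thesis.

```python
# Hypothetical sketch: estimating a reader's proficiency level from
# fixation-based gaze features with an off-the-shelf classifier.
# Features and data are synthetic placeholders, not the thesis pipeline.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Per-trial gaze features: mean fixation duration (ms), fixation count,
# mean saccade amplitude (deg), regression (backward saccade) rate.
X = rng.normal(loc=[220, 180, 2.5, 0.15], scale=[40, 30, 0.5, 0.05], size=(120, 4))
y = rng.integers(0, 2, size=120)  # 0 = novice, 1 = proficient (synthetic labels)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)
print(f"Cross-validated accuracy: {scores.mean():.2f}")
```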

    Creepy Technology: What Is It and How Do You Measure It?

    Interactive technologies are getting closer to our bodies and permeate the infrastructure of our homes. While such technologies offer many benefits, they can also cause an initial feeling of unease in users. It is important for Human-Computer Interaction to manage first impressions and avoid designing technologies that appear creepy. To that end, we developed the Perceived Creepiness of Technology Scale (PCTS), which measures how creepy a technology appears to a user in an initial encounter with a new artefact. The scale was developed based on past work on creepiness and a set of ten focus groups conducted with users from diverse backgrounds. We followed a structured process of analytically developing and validating the scale. The PCTS is designed to enable designers and researchers to quickly compare interactive technologies and ensure that they do not design technologies that produce initial feelings of creepiness in users.
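
    As an illustration of how such a scale might be used to compare artefacts, the sketch below scores two hypothetical prototypes from Likert-type item responses and runs a within-subjects comparison. The item count, the responses, and the mean-score convention are assumptions, not taken from the paper.

```python
# Hypothetical sketch of comparing two prototypes with a Likert-type scale
# such as the PCTS. Item responses are invented and the scoring convention
# (mean of items) is an assumption, not taken from the paper.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# 20 participants rate each of two prototypes on six hypothetical 7-point items.
ratings_a = rng.integers(1, 8, size=(20, 6))  # prototype A
ratings_b = rng.integers(1, 8, size=(20, 6))  # prototype B

score_a = ratings_a.mean(axis=1)  # per-participant scale score (assumed: item mean)
score_b = ratings_b.mean(axis=1)

# Within-subjects comparison of perceived creepiness between the prototypes.
t, p = stats.ttest_rel(score_a, score_b)
print(f"Prototype A: {score_a.mean():.2f}, Prototype B: {score_b.mean():.2f}, p = {p:.3f}")
```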

    Unreflected Acceptance -- Investigating the Negative Consequences of ChatGPT-Assisted Problem Solving in Physics Education

    Large language models (LLMs) have recently gained popularity. However, the impact of their general availability through ChatGPT on sensitive areas of everyday life, such as education, remains unclear. Nevertheless, the societal impact on established educational methods is already being experienced by both students and educators. Our work focuses on higher physics education and examines problem solving strategies. In a study, students with a background in physics were assigned to solve physics exercises, with one group having access to an internet search engine (N=12) and the other group being allowed to use ChatGPT (N=27). We evaluated their performance, strategies, and interaction with the provided tools. Our results showed that nearly half of the solutions provided with the support of ChatGPT were mistakenly assumed to be correct by the students, indicating that they overly trusted ChatGPT even in their field of expertise. Likewise, in 42% of cases, students used copy & paste to query ChatGPT -- an approach only used in 4% of search engine queries -- highlighting the stark differences in interaction behavior between the groups and indicating limited reflection when using ChatGPT. In our work, we demonstrated a need to (1) guide students on how to interact with LLMs and (2) create awareness of potential shortcomings for users.

    Intelligent Music Interfaces: When Interactive Assistance and Augmentation Meet Musical Instruments

    The interactive augmentation of musical instruments to foster self-expressiveness and learning has a rich history. Over the past decades, the incorporation of interactive technologies into musical instruments has evolved into a new research field requiring strong collaboration between different disciplines. The workshop "Intelligent Music Interfaces" consequently covers a wide range of musical research subjects and directions, including (a) current challenges in musical learning, (b) prototyping for improvements, (c) new means of musical expression, and (d) evaluation of the solutions.

    SAFER: Development and Evaluation of an IoT Device Risk Assessment Framework in a Multinational Organization

    Users of Internet of Things (IoT) devices are often unaware of their security risks and cannot sufficiently factor security considerations into their device selection. This puts networks, infrastructure and users at risk. We developed and evaluated SAFER, an IoT device risk assessment framework designed to improve users' ability to assess the security of connected devices. We deployed SAFER in a large multinational organization that permits use of private devices. To evaluate the framework, we conducted a mixed-method study with 20 employees. Our findings suggest that SAFER increases users' awareness of security issues. It provides valuable advice and impacts device selection. Based on our findings, we discuss implications for the design of device risk assessment tools, with particular regard to the relationship between risk communication and user perceptions of device complexity.
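
    For illustration only, the sketch below aggregates a few device attributes into a single risk score. The attributes, weights, and scale are invented and do not reflect how SAFER actually computes its assessments.

```python
# Purely illustrative sketch of aggregating device attributes into a risk
# score. Attributes, weights and scoring are invented for illustration and
# do not reflect SAFER's actual assessment method.
from dataclasses import dataclass

@dataclass
class DeviceProfile:
    known_vulnerabilities: int   # e.g. number of matching CVE entries
    default_credentials: bool    # ships with a default password
    days_since_last_update: int  # firmware update recency

def risk_score(d: DeviceProfile) -> float:
    """Combine attributes into a 0-10 risk score (higher = riskier)."""
    score = min(d.known_vulnerabilities, 5)          # cap the CVE contribution
    score += 3 if d.default_credentials else 0       # weak-credential penalty
    score += min(d.days_since_last_update / 365, 2)  # staleness penalty
    return min(score, 10.0)

camera = DeviceProfile(known_vulnerabilities=2, default_credentials=True,
                       days_since_last_update=400)
print(f"Risk score: {risk_score(camera):.1f} / 10")
```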

    Quantifying Meaningful Interaction: Developing the Eudaimonic Technology Experience Scale

    Recent research has shown that users increasingly seek meaning in technologies and that eudaimonic user experience (UX) is part of everyday encounters with technology. Yet, to date, there is no validated means to assess eudaimonic properties in interactive artefacts. We conceptualised, developed and validated a six-item questionnaire for measuring eudaimonic properties of technologies: the Eudaimonic Technology Experience Scale (ETES). Our scale includes two factors, which describe what aspects of a eudaimonic experience can be supported by technology: eudaimonic goals and self-knowledge. We consulted work in Human-Computer Interaction (HCI), psychology and philosophy to gather an initial set of concepts that could contribute to eudaimonic UX. We then built the scale based on expert interviews and exploratory factor analysis and verified its quality in a number of tests (confirmatory factor analysis, reliability and validity checks). ETES provides a standardised tool for identifying eudaimonic qualities in interactive systems and allows for rapidly comparing prototypes.
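
    The sketch below shows one of the reliability checks mentioned above, Cronbach's alpha, computed over synthetic responses to a hypothetical six-item scale; the actual ETES items and validation data are in the paper.

```python
# Minimal sketch of a reliability check (Cronbach's alpha) for a six-item
# scale. The response data are synthetic; the real ETES items and validation
# procedure are described in the paper.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: (n_respondents, n_items) matrix of Likert responses."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances / total_variance)

rng = np.random.default_rng(2)
# Synthetic responses: a latent trait plus item noise, 200 respondents x 6 items.
trait = rng.normal(4, 1, size=(200, 1))
responses = np.clip(np.rint(trait + rng.normal(0, 0.8, size=(200, 6))), 1, 7)
print(f"Cronbach's alpha: {cronbach_alpha(responses):.2f}")
```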

    Opportunities and applications of ultrasound sensing on unmodified consumer-grade smartphones

    A person's smartphone is a cornucopia of information, be it personal data extracted from contacts and calendar entries or the current location via GPS. The huge variety of sensors in today's mobile phones makes these devices a prime target for human activity recognition. The smartphone is no longer solely seen as an actuator in smart environments, enabling the user to control auxiliary devices and sensors, but can now play a vital part in the network of sensing information itself. Especially in the area of human activity recognition, camera-based or body-worn systems are predominant. While they achieve high accuracy, these methods often suffer from privacy issues or obtrusiveness and, consequently, social stigma. In this thesis, I present an unobtrusive approach to perceive the vicinity surrounding the phone by leveraging the properties of ultrasound sensing. The device emits ultrasonic waves via its speaker and records the echo via the microphone. By analyzing the received signal, I can deduce certain movements, e.g., gestures performed above the phone, but also more complex motions involving the whole body of the user. I outline various experiments to assess the feasibility of ultrasound sensing in different scenarios and propose an algorithm and mobile application that can classify given gestures and activities performed by the user. The system is able to recognize predefined gestures with an overall accuracy of 81% across six different users and can detect human activities up to 2 m away.
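
    The sensing principle can be sketched as follows: the phone plays a continuous ultrasonic tone, and motion in front of the device Doppler-shifts the reflected signal, which shows up as energy next to the carrier in the microphone spectrum. The parameters and the simulated echo below are assumptions for illustration, not the thesis implementation.

```python
# Illustrative sketch of Doppler-based ultrasound sensing: a continuous
# ultrasonic tone is emitted, and motion shifts the frequency of its echo.
# The simulated echo and all parameters are assumptions for this sketch.
import numpy as np

fs = 48_000          # common smartphone sampling rate (Hz)
f_carrier = 20_000   # emitted ultrasonic tone (Hz)
t = np.arange(0, 0.1, 1 / fs)  # one 100 ms analysis frame

direct = np.sin(2 * np.pi * f_carrier * t)
# Echo from a hand moving towards the phone: small positive Doppler shift.
echo = 0.3 * np.sin(2 * np.pi * (f_carrier + 60) * t)
mic = direct + echo + 0.05 * np.random.default_rng(3).normal(size=t.size)

spectrum = np.abs(np.fft.rfft(mic * np.hanning(t.size)))
freqs = np.fft.rfftfreq(t.size, 1 / fs)

upshift = (freqs > f_carrier + 20) & (freqs < f_carrier + 200)    # approaching motion
downshift = (freqs > f_carrier - 200) & (freqs < f_carrier - 20)  # receding motion
if spectrum[upshift].sum() > 2 * spectrum[downshift].sum():
    print("Motion towards the device detected")
```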

    I Know What You Want: Using Gaze Metrics to Predict Personal Interest

    In daily communications, we often use interpersonal cues - telltale facial expressions and body language - to moderate responses to our conversation partners. While we are able to interpret gaze as a sign of interest or reluctance, conventional user interfaces do not yet take advantage of this cue. In our work, we evaluate to what degree fixation-based gaze metrics can be used to infer a user's personal interest in the displayed content. We report on a study (N=18) where participants were presented with a grid array of different images while their gaze behavior was recorded. Our system calculated a ranking of the shown images based on gaze metrics. By analyzing their agreement with the system's ranking, we found that all metrics are effective indicators of the participants' interest. In an evaluation in a museum, we found that this translates to in-the-wild scenarios despite environmental constraints, such as limited data accuracy.
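
    A minimal sketch of ranking displayed images by one fixation-based metric, total fixation duration per image, as a proxy for interest. The metric choice and the sample fixations are assumptions for illustration, not the exact metrics used in the study.

```python
# Illustrative sketch: rank displayed images by total fixation duration.
# The fixation data and the single metric are placeholders, not the study's
# exact gaze metrics.
from collections import defaultdict

# Each fixation: (image the fixation landed on, duration in ms).
fixations = [("img_3", 310), ("img_1", 180), ("img_3", 450),
             ("img_7", 220), ("img_3", 200), ("img_1", 140)]

dwell = defaultdict(int)
for image_id, duration in fixations:
    dwell[image_id] += duration  # total fixation duration per image

# Rank images by dwell time as a proxy for personal interest.
ranking = sorted(dwell, key=dwell.get, reverse=True)
print("Predicted interest ranking:", ranking)
```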